I want to talk a little bit about robots and human behavior: the major difference between cybernated organisms and human systems. A lot of people think that programming is exactly the same in people and robots. It is not.

The major difference is that you can design a robot to walk over, pick up an object, and put it in another place; but if, before the robot moves, you put the object in the place the robot was going to put it, the robot will still walk over and grab nothing in particular. Do you understand that? That's programmed. The difference between human systems and robots is that the human system is not linear. The robot can do certain things that you program into it, and if you look at that program under a microscope, you can see the magnetic domains that will make the robot walk over to a given area and sit in a chair. If you pull the chair away, the robot will walk over, sit on nothing, and fall over. That's programmed. The human system differs considerably.

When you work with a human being, or a chimpanzee, or any animal (I'll work with the chimp this time): put the chimp in a big box, and in that box are rods sticking out at different lengths, each with a cue on it: a circle on one, a triangle on another, different patterns. You don't have to teach it anything. It'll walk in, and sooner or later it'll touch those things. When it touches one of them, water comes forth; it touches another one, food; it touches another one, and a soft bed comes out of the wall. If the animal is put there for long enough, it will use those rods appropriately.

Any animal has a range of behavior. When put in an environment, it doesn't respond like a robot. It looks at the environment and seeks reinforcement: food. If the leaves are circular where the food is, it will go to the circular leaves. That's called 'associative memory'. Programmed computers have no associative memory. They follow a pattern.
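[Illustration added in editing, not from the talk: a minimal Python sketch of the contrast just described. The class names, cues, and outcomes are all invented. One agent replays a fixed step list no matter what the world looks like; the other wanders, touches things, and keeps a cue-to-outcome table, a rough stand-in for associative memory.]

```python
import random

random.seed(0)

class PatternFollower:
    """Follows a fixed instruction list; never consults the environment."""
    def __init__(self, steps):
        self.steps = steps

    def run(self):
        for step in self.steps:
            # No check that the step still makes sense: if the chair is
            # gone, it "sits" anyway.
            print("executing:", step)

class AssociativeLearner:
    """Explores the box and remembers which cue led to which outcome."""
    def __init__(self):
        self.memory = {}                      # cue -> outcome

    def explore(self, environment):
        cue = random.choice(list(environment))
        self.memory[cue] = environment[cue]   # touch a rod, note what happened

    def act(self, need):
        for cue, outcome in self.memory.items():
            if outcome == need:
                return "touch the " + cue + " rod"
        return "keep exploring"

# The box with rods: each rod carries a cue and yields an outcome.
box = {"circle": "water", "triangle": "food", "square": "bed"}

robot = PatternFollower(["walk to chair", "sit down"])
robot.run()                                   # sits whether or not a chair is there

chimp = AssociativeLearner()
for _ in range(20):                           # left in the box long enough
    chimp.explore(box)
print(chimp.act("water"))                     # -> touch the circle rod
```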
If you look at a phonograph record under a microscope, you'll see zigzags cut in the bakelite [vinyl] record. Those zigzags are representations of a person's voice. While the record is playing, if it's Caruso singing, it can't deviate from those patterns. Robots that are programmed can't deviate from their programs either, unless you build a path alongside the program.

Now show the animal variations: instead of a circle, a slight ellipse. The animal will touch that thing thinking it's a circle, because animals are not that critical, and it gets burned slightly, so it will never touch that one again. That's what the animal has that the robot doesn't have. If the robot touches something and it doesn't reinforce it... how do you reinforce a robot? If he gets stung, that wouldn't bother him at all. A robot can learn to respond to different figures: when he sees a triangle and presses a button, he gets lubricated. But he doesn't feel good when he gets lubricated, so there's no reason to retain that action. It's only when humans touch something and feel good touching it that they repeat it. A robot cannot touch something and say "Hey, that feels good." It can reach out, do that, and pull back, but it can't make anything of it. Do you understand that?

The reason I say "Do you understand that?" is that there's so much conflict today about robots and people: will robots take over? Not if they're programmed not to take over. If they're programmed to take over, they can still only shoot, say, a guy in a certain uniform. If the guy stays put in a given area, the robot can walk over... unless you condition the robot's eyes to follow anything that moves and shoot it. Is such a robot an assassin? No, it's programmed to shoot. That's quite different.

Human beings have, some people believe, 10 to 15 billion neurons. A robot has hundreds of thousands of associative sets, not billions. If you learn that a cup gives you water, then anything that looks like a cup might hold water. We can deviate from our programming. Our programming appears rigid, but alongside it is associative memory: I touched that and I felt pain; I touched the other thing and I didn't feel pain, I got something.
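[Another editorial sketch, with invented shapes and numbers: retention is gated on a felt reward, a burn produces one-shot avoidance that generalizes from a circle to a slight ellipse, and an agent whose "feeling" is always zero retains nothing, which is the lubrication point above.]

```python
def similarity(a, b):
    """Crude shape similarity: identical shapes score 1.0; a slight
    ellipse reads as almost a circle. Values invented for the sketch."""
    if a == b:
        return 1.0
    return {frozenset(("circle", "ellipse")): 0.9}.get(frozenset((a, b)), 0.0)

class Animal:
    def __init__(self):
        self.avoided = set()     # one slight burn: never touch that one again
        self.retained = {}       # shape -> how good touching it felt

    def touch(self, shape, feeling):
        if feeling < 0:
            self.avoided.add(shape)
        elif feeling > 0:
            self.retained[shape] = feeling   # felt good, so the action is kept

    def will_touch(self, shape):
        # Generalization: anything close enough to a known burn is avoided.
        return not any(similarity(shape, bad) > 0.8 for bad in self.avoided)

animal = Animal()
animal.touch("circle", feeling=1.0)      # water came forth: felt good
print(animal.will_touch("ellipse"))      # True: not that critical, looks circular
animal.touch("ellipse", feeling=-1.0)    # gets burned slightly
print(animal.will_touch("ellipse"))      # False: never touches that one again

robot = Animal()
robot.touch("triangle", feeling=0.0)     # gets lubricated, but feels nothing
print(robot.retained)                    # {} : no reason to retain the action
```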
A robot never looks at a thing and says "That's interesting!" If you were to float in midair in front of a robot, it wouldn't say "Now that is interesting!" It can't do that; it can only do what it's programmed to do. A man can see things, and be programmed, and compare one thing to something else. Is that very clear, or do you want to question anything there?

That's a major difference between programmed behavior and human programming. Humans have a lot of associations prior to any programming, so if something reminds a person of one of those other associations, he can deviate. That's why people walk out of here, when I speak, with different interpretations.

(Roxanne) Kurzweil talks about using nanotechnology and implanting something in the head, like a second brain, that may be so fast that it could take over the other brain in the person. This is something he's raised.

- You can do that, but it doesn't give [the robot] leverage like "I wonder about that." I've never seen that happen. A man could say that. A man could look at an event and say "That's strange, the way that paper holds up that speaker." A robot does not do that. It looks at the speaker. It doesn't even look at it and say "That looks like a speaker," unless you put a speaker in front of the robot and tell it "That's a speaker," so that when its eye sees it, it says "That's a speaker." When you turn the speaker sideways, it doesn't know what that is, until you turn it sideways and say "That's also a speaker." If you rotate the speaker into many positions, so the robot has associations with the shape in different positions, it can call it a speaker. You can teach it to call an orange an orange, but if you cut the orange in half, it can't call it an orange. It doesn't say "It looks like half an orange." Do you understand that?
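[Editorial sketch of the speaker and half-orange point, with invented view labels: a lookup-style recognizer knows only the exact views it was explicitly taught, and an untaught view stays unknown.]

```python
class LookupRecognizer:
    """Names things only by exact, explicitly taught views."""
    def __init__(self):
        self.labels = {}                       # exact view -> name

    def teach(self, view, name):
        self.labels[view] = name               # "That's a speaker."

    def recognize(self, view):
        return self.labels.get(view, "unknown")

robot_eye = LookupRecognizer()
robot_eye.teach("speaker_front", "speaker")
print(robot_eye.recognize("speaker_front"))    # "speaker"
print(robot_eye.recognize("speaker_side"))     # "unknown" until taught
robot_eye.teach("speaker_side", "speaker")     # "That's also a speaker."
print(robot_eye.recognize("speaker_side"))     # "speaker"

robot_eye.teach("orange_whole", "orange")
print(robot_eye.recognize("orange_half"))      # "unknown": it can't say
                                               # "looks like half an orange"
```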
(Roxanne) This thing that Kurzweil is putting out, like the 'singularity', is when you start to implant things in people's heads that have so much more calculating ability, or...

- If it is not connected to the other neurons, it won't do anything.

(Roxanne) It couldn't take over?

- No, unless it's connected.

(Joel) He's proposing that it is connected somehow, that there is some interface between...

- If there is an interface: organic neurons respond at a certain rate of speed. Anything beyond that rate, they can't respond to. Electronic signals travel at almost the speed of light; neural associations are relatively slow. If you try to speed up the digestion of food in a human: the digestive acids flow at a certain rate. If you were to triple that rate, it might digest portions of the intestines. Do you understand what I mean?

For instance, let's take a bear and put it in a room this size with a bunch of objects sticking out (they have to stick out), 40 of them. A bear might learn how to use 12 of them, but not 40. It won't remember 40; it doesn't have the neuronal capacity to remember them. A bear can remember, though, because when a bear walks through an environment: "This bush has berries," and he remembers where the bush is; "This area has animals that I can eat." A bear can build up maybe thousands of associations, but first you have to study the range of the animal. How many levers an animal can remember will tell you what its response to the outside world will be. If you find that a bear can learn to work 47 levers (a little beyond that, or a little less), then you know what the bear can respond to in the environment: about that many different systems.

A human can generate associations with thousands of things in the environment.

If a human learns to eat a certain food, and he has to climb a tree to get it, then if you put that food at the base of the tree, he won't climb the tree. A robot will. Do you understand that? If you program a robot to climb a tree to get the apple, it'll do that. If you put the apple on the ground, does the robot go "Ah!" and simplify things? No, not unless you build something special into the robot to handle unforeseen variables, and that's what people don't know how to do yet. They don't know how to program a robot to say "What have we here?" because the words "What have we here?" don't mean anything to a robot. They mean something to a human being.
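[Editorial sketch of the tree-and-apple point, invented names: the fixed plan climbs regardless, and the only way it "handles" the apple on the ground is a check someone foresaw and wrote in.]

```python
def fixed_plan(world):
    """The programmed routine: climb no matter where the apple is."""
    return ["walk to tree", "climb tree", "grab apple", "climb down"]

def plan_with_check(world):
    """The check doesn't appear by itself; someone had to foresee the
    'unforeseen variable' and program it in."""
    if world["apple_location"] == "ground":
        return ["walk to tree", "grab apple"]
    return fixed_plan(world)

world = {"apple_location": "ground"}
print(fixed_plan(world))        # climbs anyway
print(plan_with_check(world))   # skips the climb, only because we wrote the check
```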
(Roxanne) The roboticists and Kurzweil are bringing up that, when the robot does become connected to the environment and does have enough information, it would surpass and overtake people.

- No. A robot does not ask questions. A robot does not say "I've been here before; I've seen that before." They don't have enough neurons to build all kinds of associations.

(Roxanne) Do you think it's possible that they can do that if they use biological...

- If it's programmed, no. But even if a robot is self-programming, the reasons for a robot's actions are very different than in human systems. When a human puts something into his mouth, it tastes good; if a robot does it, nothing. There's no reward. What's a reward to a robot? Sitting down, does it say "Whew, I'm so tired, and now I feel better"? He doesn't feel. When he sits down, he doesn't say "It's good to have a chair in my area." He doesn't give a shit about those things. If the light gets so bright that the eyes of the robot turn off, he says "I can't see" (if you wire him that way), but he doesn't turn down the light unless you wire it that way. He turns down the light when it gets bright not because it's bright, but because he has sensors wired to turn down the light. Do you understand that difference?

(Roxanne) You were explaining this last night in terms of: humans have to have experience to react...

- Yes. That means a robot doesn't seek experience. A robot doesn't want to know why some tires wear out faster than others. He doesn't take a microscope and look at the rubber. A robot is not equipped that way. Robots don't have pleasure and pain. If they had pleasure and pain, they would have preferences.
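[Editorial sketch of the bright-light point, threshold invented: the action fires because a rule is wired in, not because anything is felt.]

```python
def light_reflex(lux, wired_to_dim):
    """If the sensor trips the threshold AND the rule was wired in,
    the action fires. Nothing is felt either way."""
    if lux > 10_000 and wired_to_dim:
        return "turn the light down"
    return "do nothing"

print(light_reflex(20_000, wired_to_dim=True))    # dims it: wiring, not glare
print(light_reflex(20_000, wired_to_dim=False))   # nothing: it can't mind brightness
```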
Do you understand? If a robot cuts wood with a rotary saw, all the wood is shoved in there automatically and he cuts it. If you put a human in there, he'll cut him too. He doesn't think "That's a person; I don't want to cut that." The robot cannot do anything unless it's programmed to do it. It could be programmed to cut wood, but it will cut anything else shoved in there. If you interfere with its programming, it doesn't say "Wait a while, you're interfering with the programming!"

If radar picks up fog in San Francisco, the airplane might fly above the weather. The airplane moves up based on what's out there: if it's fog, it moves up. But it doesn't move up to avoid the fog! That's human projection. A robot moves up because there was fog ahead: its sensors bounce a signal back, and that signal controls the elevators and makes it move up.

If it's raining, the robot may open an umbrella above itself. But when the raindrops hit, they bridge two terminals, so current flows across them, and that opens the umbrella. Does the robot say "It's raining; I'm going to get wet; I'll open the umbrella"? No, none of that. Do you understand that? I'm talking about robotics today. When people ask "Do you think robots will take over?", there's no basis for it if they're programmed a certain way.
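[A last editorial sketch covering the umbrella and the fog, event names invented: pure condition-to-action rules, with no model of rain, wetness, or intent behind them.]

```python
# Each rule maps a sensor event straight to an actuator command.
REFLEXES = {
    "rain_bridges_terminals": "open umbrella",   # current flows; umbrella opens
    "radar_echo_fog_ahead": "raise elevators",   # plane climbs; it isn't "avoiding" fog
}

def step(sensor_events):
    """Fire every wired reflex; unwired events simply do nothing."""
    return [REFLEXES[event] for event in sensor_events if event in REFLEXES]

print(step(["rain_bridges_terminals"]))          # ['open umbrella']
print(step(["sunny", "radar_echo_fog_ahead"]))   # ['raise elevators']; "sunny" is ignored
```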